
Building a Toolset to Calibrate Models from Heterogeneous Experimental Data: What are the Industrial Requirements?
As data become central to today's sciences, experimental setups are growing increasingly complex in an effort to acquire as much data as possible. However, analysing large amounts of heterogeneous data to calibrate industrial models is one of the next challenges. Before performing a calibration test, major questions must be answered: what kind and amount of data should be gathered, and how should they be weighted to robustly evaluate quantities of interest (QOIs)? Modern tools, such as the EikoTwin software suite [1], are being built to help industrial players answer these questions. Through the virtualisation of the entire experimental setup, it is possible to conduct a pre-study spanning from data acquisition up to QOI outputs (e.g., constitutive parameters). The calibration relies, for instance, on Finite Element Model Updating (FEMU), chosen for its reliability and non-intrusiveness. Measurement uncertainty propagation is used, first, within a Bayesian framework to properly weight heterogeneous measurements [2] and, second, to assess identifiability [3]. Moreover, an uncertainty-weighted sensitivity analysis is conducted, allowing the right amount of data to be selected to determine the QOIs. This selection is a trade-off between uncertainty and the amount of measured and simulated data. The example, a real tensile test on a drilled dog-bone sample, focuses on data coming from images (via Digital Image Correlation), strain gauges, and a force sensor to calibrate the parameters of a Ludwik elastoplastic constitutive law (with five unknown parameters). This case study illustrates how the EikoTwin software suite addresses both the choice of relevant experimental data and the analysis of heterogeneous data with respect to measurement uncertainties in order to calibrate a solid mechanics model.
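
As a minimal sketch of the ideas summarised above (not the authors' exact formulation), an uncertainty-weighted FEMU calibration is commonly written as a covariance-weighted least-squares problem combining the heterogeneous measurements, and Ludwik hardening is usually written in its three-parameter form; the symbols below are illustrative, and the assignment of the remaining unknowns (e.g., to elastic constants) is an assumption.

% Covariance-weighted (Bayesian) FEMU cost function: measurements m_k
% (here assumed to be DIC displacements, strain gauges, and force) are
% compared with their simulated counterparts s_k(p) for the parameter
% vector p, each weighted by the inverse measurement covariance C_k.
\begin{equation}
  \chi^2(\mathbf{p}) \;=\; \sum_{k \in \{\mathrm{DIC},\,\mathrm{gauge},\,\mathrm{force}\}}
  \left[\mathbf{m}_k - \mathbf{s}_k(\mathbf{p})\right]^{\mathsf{T}}
  \mathbf{C}_k^{-1}
  \left[\mathbf{m}_k - \mathbf{s}_k(\mathbf{p})\right]
\end{equation}

% Ludwik hardening law in its common form, with yield stress \sigma_y,
% hardening modulus K, and hardening exponent n; which two additional
% parameters complete the five unknowns is not specified here.
\begin{equation}
  \sigma \;=\; \sigma_y + K\,\varepsilon_p^{\,n}
\end{equation}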